Search results for "Mutual information"

Showing 10 of 66 documents

Assessing optimal water quality monitoring network in road construction using integrated information-theoretic techniques

2020

Author's accepted manuscript. The environmental impacts of road construction on the aquatic environment necessitate the monitoring of receiving water quality. The main contribution of the paper is developing a feasible methodology for spatial optimization of the water quality monitoring network (WQMN) in surface water during road construction using field data. First, using the Canadian Council of Ministers of the Environment (CCME) method, the water quality index (WQI) was computed at each potential monitoring station during construction. Then, the integrated form of the information-theoretic techniques, consisting of the transinformation entropy (TE) and the value of information (VOI), we…
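The transinformation entropy used in such network designs is, at base, the mutual information between the records of two candidate stations. A minimal plug-in (histogram) sketch on synthetic series — the station names and data below are hypothetical, not the paper's field measurements:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of mutual information in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    mi = 0.0
    for i in range(bins):
        for j in range(bins):
            if pxy[i, j] > 0:
                mi += pxy[i, j] * np.log(pxy[i, j] / (px[i] * py[j]))
    return mi

rng = np.random.default_rng(0)
station_a = rng.normal(size=500)
station_b = station_a + 0.5 * rng.normal(size=500)   # hypothetical correlated station
station_c = rng.normal(size=500)                     # hypothetical independent station

mi_ab = mutual_information(station_a, station_b)     # high transinformation
mi_ac = mutual_information(station_a, station_c)     # near zero (estimator bias only)
```

A station whose record shares little information with the rest of the network carries more independent value, which is what the optimization trades off against cost.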

Keywords: meteorology & atmospheric sciences; road construction; operations research; computer science; environmental engineering; analytic hierarchy process; TOPSIS; ideal solution; mutual information; construction technology; value of information; entropy (information theory); water quality; water science and technology. Source: Journal of Hydrology.

Ultra-Fast Detection of Higher-Order Epistatic Interactions on GPUs

2017

Detecting higher-order epistatic interactions in Genome-Wide Association Studies (GWAS) remains a challenging task in the fields of genetic epidemiology and computer science. A number of algorithms have recently been proposed for epistasis discovery. However, they suffer from a high computational cost since statistical measures have to be evaluated for each possible combination of markers. Hence, many algorithms use additional filtering stages discarding potentially non-interacting markers in order to reduce the overall number of combinations to be examined. Among others, Mutual Information Clustering (MIC) is a common pre-processing filter for grouping markers into partitions using K-Means…
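The MI-based filtering stage mentioned above can be sketched as scoring each marker's mutual information with the phenotype and discarding low scorers. A toy numpy version on simulated genotypes — the single informative marker and all variable names are invented for illustration, and this is a drastic simplification of MIC's K-means partitioning:

```python
import numpy as np

def discrete_mi(x, y):
    """Plug-in mutual information (nats) between two discrete sequences."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xv), len(yv)))
    for a, b in zip(xi, yi):
        joint[a, b] += 1
    joint /= joint.sum()
    px = joint.sum(1, keepdims=True)
    py = joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n = 400
phenotype = rng.integers(0, 2, n)
informative = (phenotype + rng.integers(0, 2, n)) % 3   # genotype depending on phenotype
noise = rng.integers(0, 3, n)                           # independent genotype
markers = np.column_stack([noise, informative])

# score each marker; a filter would discard the low-MI column before pair search
scores = [discrete_mi(markers[:, j], phenotype) for j in range(markers.shape[1])]
keep = int(np.argmax(scores))
```

Filtering this way shrinks the combinatorial search space before the expensive exhaustive evaluation of marker combinations.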

Keywords: theoretical computer science; computer science; contrast (statistics); genome-wide association study; mutual information; machine learning; reduction (complexity); genetic epidemiology; epistasis; artificial intelligence & image processing; cluster analysis; genetic association.

Lag-specific transfer entropy as a tool to assess cardiovascular and cardiorespiratory information transfer

2014

In the study of interacting physiological systems, model-free tools for time series analysis are fundamental to provide a proper description of how the coupling among systems arises from the multiple regulatory mechanisms involved. This study presents an approach which evaluates direction, magnitude, and exact timing of the information transfer between two time series belonging to a multivariate dataset. The approach performs a decomposition of the well-known transfer entropy (TE), which 1) identifies, according to a lag-specific information-theoretic formulation of the concept of Granger causality, the set of time lags associated with significant information transfer, and 2) assig…
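A lag-specific transfer entropy can be written as the conditional mutual information I(Y_t; X_{t−lag} | Y_{t−1}). A rough binned plug-in estimator — a simplified sketch with order-1 conditioning on the target's past, not the authors' full decomposition:

```python
import numpy as np

def transfer_entropy(x, y, lag=1, bins=2):
    """Plug-in estimate of I(y_t ; x_{t-lag} | y_{t-1}) after equal-frequency binning."""
    def disc(s):
        edges = np.quantile(s, np.linspace(0, 1, bins + 1)[1:-1])
        return np.searchsorted(edges, s)
    xs, ys = disc(x), disc(y)
    m = max(lag, 1)
    yt = ys[m:]                        # present of y
    yp = ys[m - 1:-1]                  # immediate past of y
    xl = xs[m - lag:len(xs) - lag]     # lagged past of x

    def joint_entropy(*cols):
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log(p)).sum()

    # TE = H(yt, yp) + H(yp, xl) - H(yt, yp, xl) - H(yp)
    return (joint_entropy(yt, yp) + joint_entropy(yp, xl)
            - joint_entropy(yt, yp, xl) - joint_entropy(yp))

rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)
y = np.empty(n)
y[:2] = rng.normal(size=2)
y[2:] = x[:-2] + 0.3 * rng.normal(size=n - 2)   # x drives y at lag 2

te_xy = transfer_entropy(x, y, lag=2)   # along the true coupling direction
te_yx = transfer_entropy(y, x, lag=2)   # against it
```

Scanning `lag` and keeping only lags whose TE exceeds a surrogate-based significance threshold is the spirit of the lag-specific decomposition described above.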

Keywords: information transfer; multivariate statistics; dynamical systems theory; entropy; biomedical engineering; blood pressure; Granger causality; control theory; autonomic nervous system; multivariate time series; mutual information; cardiovascular control; conditional entropy (CE); respiration; cardiovascular models; computational biology; causality; nonlinear systems; transfer entropy.

Comparison of discretization strategies for the model-free information-theoretic assessment of short-term physiological interactions

2023

This work presents a comparison between different approaches for the model-free estimation of information-theoretic measures of the dynamic coupling between short realizations of random processes. The measures considered are the mutual information rate (MIR) between two random processes X and Y and the terms of its decomposition evidencing either the individual entropy rates of X and Y and their joint entropy rate, or the transfer entropies from X to Y and from Y to X and the instantaneous information shared by X and Y…
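One of the discretization strategies compared — permutation (ordinal-pattern) symbolization — maps each length-m window of a series to its rank pattern before entropies are computed. A toy illustration on synthetic series (white noise versus a random walk, both invented here; this shows only the symbolization step, not the MIR decomposition itself):

```python
import numpy as np
from itertools import permutations

def ordinal_symbols(x, m=3):
    """Map each length-m window of x to the index of its ordinal (permutation) pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([patterns[tuple(np.argsort(x[t:t + m]))]
                     for t in range(len(x) - m + 1)])

def shannon_entropy(symbols):
    """Plug-in Shannon entropy (nats) of a symbol sequence."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(3)
white = rng.normal(size=3000)
walk = np.cumsum(rng.normal(size=3000))   # strongly autocorrelated

h_white = shannon_entropy(ordinal_symbols(white))   # near ln(3!) = ln 6
h_walk = shannon_entropy(ordinal_symbols(walk))     # lower: monotone runs dominate
```

Unlike equal-width binning, the ordinal coding is invariant to monotone amplitude transformations, which is one reason the two strategies behave differently on short physiological recordings.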

Keywords: applied mathematics; general physics and astronomy; statistical and nonlinear physics; information-theoretic measures; mutual information rate (MIR); binning; permutation; time-series analysis; mathematical physics.

Optimal Sampling Period and Required Number of Samples for OSTBC-MIMO Rayleigh Fading Channel Capacity Simulators

2014

The purpose of this paper is to contribute to the performance assessment of channel capacity simulators. Here, we consider the instantaneous capacity (also referred to as the mutual information) in orthogonal space-time block code (OSTBC) transceiver systems over multiple-input multiple-output (MIMO) Rayleigh fading channels. To ensure that the level-crossing rate (LCR) of the instantaneous capacity can efficiently and accurately be simulated, we derive closed-form approximate solutions to the optimal sampling period and the required number of samples to be generated. Several numerical examples will be presented to illustrate the usefulness of our procedure. It will also be shown that the d…
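The instantaneous capacity referred to here is the standard log-det mutual information of a single MIMO channel realization, C = log2 det(I + (SNR/Nt) H Hᴴ). A small sketch for an uncorrelated Rayleigh channel — the antenna counts and SNR are arbitrary example values, and the OSTBC transceiver processing is omitted:

```python
import numpy as np

def instantaneous_capacity(H, snr):
    """log2 det(I + (snr/nt) H H^H) in bit/s/Hz for one channel realization."""
    nr, nt = H.shape
    A = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
    _, logdet = np.linalg.slogdet(A)   # A is Hermitian positive definite
    return logdet / np.log(2)

rng = np.random.default_rng(4)
nt = nr = 2
snr = 10.0   # linear SNR (10 dB)
# Rayleigh fading: i.i.d. circularly-symmetric complex Gaussian entries, unit power
H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
c = instantaneous_capacity(H, snr)
```

Simulating the level-crossing rate of this quantity requires generating many temporally correlated realizations of H, which is exactly where the paper's choice of sampling period and sample count matters.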

Keywords: block codes; spatial correlation; channel capacity; control theory; MIMO; mutual information; transceiver; Gaussian process; information theory; Rayleigh fading. Source: 2014 IEEE 80th Vehicular Technology Conference (VTC2014-Fall).

Functional connectivity inference from fMRI data using multivariate information measures

2022

Shannon's entropy, or extensions of it, can be used to quantify information transmission between or among variables. Mutual information is the pair-wise information that captures nonlinear relationships between variables, making it more robust than linear correlation methods. Beyond mutual information, two generalizations are defined for multivariate distributions: interaction information (co-information) and total correlation (multi-mutual information). In comparison to mutual information, interaction information and total correlation are underutilized and poorly studied in applied neuroscience research. Quantifying information flow between brain regions is not explic…
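Both generalizations reduce to signed sums of joint entropies. A minimal sketch on three synthetic binary variables — the XOR example is ours, chosen because it makes total correlation positive while the co-information form of interaction information comes out negative (sign conventions for interaction information vary in the literature):

```python
import numpy as np

def entropy_bits(*cols):
    """Plug-in joint entropy (bits) of one or more discrete columns."""
    _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(5)
x = rng.integers(0, 2, 4000)
y = rng.integers(0, 2, 4000)
z = x ^ y   # pairwise independent, but jointly fully determined

# total correlation: sum of marginal entropies minus joint entropy
total_corr = entropy_bits(x) + entropy_bits(y) + entropy_bits(z) - entropy_bits(x, y, z)

# co-information (one convention for interaction information)
co_info = (entropy_bits(x) + entropy_bits(y) + entropy_bits(z)
           - entropy_bits(x, y) - entropy_bits(x, z) - entropy_bits(y, z)
           + entropy_bits(x, y, z))
```

Here `total_corr` is about 1 bit and `co_info` about −1 bit: no pair of variables shares information, yet the triple is maximally synergistic — exactly the kind of structure pairwise mutual information misses in functional connectivity analysis.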

Keywords: brain mapping; computer science; entropy; cognitive neuroscience; conditional mutual information; multivariate normal distribution; mutual information; magnetic resonance imaging; interaction information; redundancy (information theory); artificial intelligence; computer simulation; total correlation; information flow; data mining; neural networks.

Mutual Information Analysis of Brain-Body Interactions during different Levels of Mental stress

2019

In this work, we analyze brain-heart interactions during different mental states computing mutual information (MI) between the dynamic activity of different physiological systems. In 18 healthy subjects monitored in a relaxed resting state and during a mental arithmetic and a serious game task, multichannel EEG, one lead ECG, respiration and blood volume pulse were collected via wireless non-invasive biosensors. From these signals, synchronous 300-second time series were extracted measuring brain activity via the δ, θ, α, and β EEG power, and activity of the body district via the ECG R-R interval η, the respiratory amplitude ϱ and the pulse arrival time π. MI was computed using a linear est…
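For jointly Gaussian series, a linear MI estimator reduces to a function of the correlation coefficient, MI = −(1/2) ln(1 − r²). A toy version with simulated series standing in for an EEG power and an R-R interval series — the variable names and coupling strength are illustrative, not the study's data:

```python
import numpy as np

def linear_mi(x, y):
    """Gaussian (linear) mutual information estimate in nats: -0.5 * ln(1 - r^2)."""
    r = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - r**2)

rng = np.random.default_rng(6)
eeg_power = rng.normal(size=300)                              # e.g. a band-power series
rr_interval = 0.8 * eeg_power + 0.6 * rng.normal(size=300)    # coupled body series
unrelated = rng.normal(size=300)                              # uncoupled series

mi_coupled = linear_mi(eeg_power, rr_interval)
mi_null = linear_mi(eeg_power, unrelated)
```

The estimator captures only linear dependence, which is why such analyses are typically paired with surrogate tests before claiming a brain-body link.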

Keywords: brain activity and meditation; electroencephalography (EEG); network physiology; heart rate; mutual information; resting state; pulse (signal processing); ECG; brain-heart interaction; amplitude; psychological stress.

Mutual information-based feature selection for low-cost BCIs based on motor imagery

2016

In the present study a feature selection algorithm based on mutual information (MI) was applied to electroencephalographic (EEG) data acquired during three different motor imagery tasks from two datasets: Dataset I from BCI Competition IV, including full-scalp recordings from four subjects, and new data recorded from three subjects using the popular low-cost Emotiv EPOC EEG headset. The aim was to evaluate optimal channels and band-power (BP) features for motor imagery task discrimination, in order to assess the feasibility of a portable low-cost motor-imagery-based Brain-Computer Interface (BCI) system. The minimal subset of features most relevant to task description and less redundant to…
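Selecting features that are "most relevant and least redundant" is often done greedily, scoring each candidate by its MI with the class labels minus its MI with already-selected features (an mRMR-style criterion — our simplification, not necessarily this paper's exact algorithm). A toy sketch with invented band-power features; the channel names are hypothetical:

```python
import numpy as np

def mi_nats(a, b, bins=4):
    """Plug-in mutual information (nats) between two series after equal-width binning."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(7)
n = 600
labels = rng.integers(0, 2, n).astype(float)      # two imagery classes
bp_c3 = labels + 0.5 * rng.normal(size=n)         # relevant band-power feature
bp_c3_dup = bp_c3 + 0.1 * rng.normal(size=n)      # relevant but redundant copy
bp_noise = rng.normal(size=n)                     # irrelevant channel
features = {"C3": bp_c3, "C3_dup": bp_c3_dup, "noise": bp_noise}

# greedy pick: first by pure relevance, then by relevance minus redundancy
first = max(features, key=lambda k: mi_nats(features[k], labels))
rest = {k: v for k, v in features.items() if k != first}
second = max(rest, key=lambda k: mi_nats(rest[k], labels)
                                 - mi_nats(rest[k], features[first]))
```

The redundancy penalty is what lets a low-channel-count headset compete: a near-duplicate of an already-chosen channel scores worse than a weakly relevant but independent one.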

Keywords: brain-computer interface; support vector machine; headset; speech recognition; feature extraction; biomedical engineering; health informatics; feature selection; electroencephalography; motor imagery; mutual information; pattern recognition; signal processing; eidetic imagery; artificial intelligence; reproducibility of results. Source: 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC).

A measure of concurrent neural firing activity based on mutual information

2020

Multiple methods have been developed in an attempt to quantify stimulus-induced neural coordination and to understand the internal coordination of neuronal responses by examining synchronization phenomena in neural discharge patterns. In this work we propose a novel approach to estimate the degree of concomitant firing between two neural units, based on a modified form of mutual information (MI) applied to a two-state representation of the firing activity. The binary profile of each single unit unfolds its discharge activity in time by decomposition into a state of neural quiescence/low activity and a state of moderate firing/bursting. Then, the MI computed between the two binary st…
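The two-state reduction described can be sketched as thresholding each unit's spike counts into low/high states and computing plug-in MI between the binary profiles. A toy example with Poisson counts sharing a common drive — all rates and the median threshold are our invented illustration, not the paper's modified MI measure:

```python
import numpy as np

def binarize(counts, thresh):
    """Two-state profile: 0 = quiescence/low activity, 1 = moderate firing/bursting."""
    return (counts > thresh).astype(int)

def binary_mi_bits(a, b):
    """Plug-in mutual information (bits) between two binary sequences."""
    p = np.zeros((2, 2))
    for i, j in zip(a, b):
        p[i, j] += 1
    p /= p.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(8)
drive = rng.poisson(3.0, size=1000)            # shared bursting drive
unit1 = drive + rng.poisson(1.0, size=1000)    # spike counts per time bin
unit2 = drive + rng.poisson(1.0, size=1000)    # co-modulated unit
unit3 = rng.poisson(4.0, size=1000)            # independent unit, same mean rate

s1, s2, s3 = (binarize(u, np.median(u)) for u in (unit1, unit2, unit3))
mi_sync = binary_mi_bits(s1, s2)    # concomitant firing
mi_indep = binary_mi_bits(s1, s3)   # near zero
```

Working on the binary state sequences rather than raw counts makes the measure insensitive to differences in absolute firing rate between the two units.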

Keywords: bursting; computer science; pattern recognition; mutual information; artificial intelligence; representation (mathematics); retinal ganglion; measure (mathematics); synchronization.

Artificial intelligence for affective computing : an emotion recognition case study.

2020

This chapter provides an introduction to the benefits of artificial intelligence (AI) techniques for the field of affective computing, through a case study about emotion recognition via brain (electroencephalography, EEG) signals. Readers are first provided with a general description of the field, followed by the main models of human affect, with special emphasis on Russell's circumplex model and the pleasure-arousal-dominance (PAD) model. Finally, an AI-based method for the detection of affect elicited via multimedia stimuli is presented. The method combines both connectivity- and channel-based EEG features with a selection method that considerably reduces the dimensionality of the data and …

Keywords: channel (digital image); logarithm; computer science; feature selection; mutual information; electroencephalography; frequency domain; affective computing; artificial intelligence.